- A. Lakshmi Kanth
- M. Dheeraj
- K. Hanumantha Rao
- S. Rajendra Prasad
- P. V. Kumar
- A. Shiva Kumar
- Nagendla Swetha
- Vannoj Ravi Kumar
- Srikar
- G. V. N. Prasad
- V. Prema Tulasi
- B. Satish Kumar
- K. Ramu
- K. Satish Kumar
- K. Hanumanth Rao
- P. Nagaraj
- Arif Mohammad Abdul
- P. Mallesham
- K. Ashok Babu
- S. Megha Chandrika
- G. Nanda Gopal
Venkatesh Sharma, K.
- Role of Probabilistic Packet Marking Mechanism in Large Scale IP Trace Backs
Authors
1 CSE Dept, Sri Indu College of Engg & Tech, Hyd, IN
2 Sri Indu College of Engg & Tech, Hyd, IN
3 Sri Indu College of Engg & Tech, Hyd, IN
Source
Wireless Communication, Vol 2, No 9 (2010), Pagination: 257-266
Abstract
This paper explains an approach to IP trace back based on the probabilistic packet marking paradigm. Our approach, which we call randomize-and-link, uses large checksum cords to link message fragments in a way that is highly scalable, for the checksums serve both as associative addresses and data-integrity verifiers. The main advantage of these checksum cords is that they spread the addresses of possible router messages across a spectrum that is too large for the attacker to easily create messages that collide with legitimate messages.
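As a rough, hypothetical sketch of the randomize-and-link idea described above (all names, fragment sizes, and the marking probability are illustrative choices, not the paper's implementation): routers probabilistically overwrite a packet's mark with one fragment of their address, every fragment carries the same checksum cord over the full address, and the victim uses the cord both to group fragments and to reject forged reconstructions.

```python
import hashlib
import random

MARK_PROB = 0.04  # illustrative marking probability

def make_fragments(router_ip: str, num_frags: int = 4):
    """Split a router address into fragments, each tagged with a checksum
    cord over the full address. The cord serves as an associative address
    (grouping fragments of the same router) and an integrity check."""
    cord = hashlib.sha256(router_ip.encode()).hexdigest()[:8]
    step = len(router_ip) // num_frags + 1
    parts = [router_ip[i:i + step] for i in range(0, len(router_ip), step)]
    return [(cord, idx, part) for idx, part in enumerate(parts)]

def mark(packet: dict, router_ip: str):
    """Router side: probabilistically overwrite the packet's mark."""
    if random.random() < MARK_PROB:
        packet["mark"] = random.choice(make_fragments(router_ip))
    return packet

def reconstruct(marks):
    """Victim side: group fragments by cord, reassemble, and keep only
    addresses whose recomputed cord matches (forged cords are dropped)."""
    by_cord = {}
    for cord, idx, part in marks:
        by_cord.setdefault(cord, {})[idx] = part
    routers = []
    for cord, frags in by_cord.items():
        addr = "".join(part for _, part in sorted(frags.items()))
        if hashlib.sha256(addr.encode()).hexdigest()[:8] == cord:
            routers.append(addr)
    return routers
```

Because an attacker cannot easily find fragments whose cord collides with a legitimate router's cord, injected marks fail the recomputation check and are discarded.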
Keywords
Associative Addresses, Checksum Cords, Distributed Denial of Service (DDoS), IP, IP Spoofing, Probabilistic Packet Marking, Trace Back.
- Rate Allocation and Network Lifetime Problems for Wireless Sensor Networks
Authors
1 CSE Dept. at Sri Indu College of Engineering & Technology, IN
2 JNTU, Hyderabad, IN
3 Sri Indu College of Engineering & Technology, JNTU Hyderabad, IN
Source
Wireless Communication, Vol 2, No 9 (2010), Pagination: 304-317
Abstract
An important performance consideration for wireless sensor networks is the amount of information collected by all the nodes in the network over the course of the network lifetime. Since the objective of maximizing the sum of rates of all the nodes in the network can lead to a severe bias in rate allocation among the nodes, we advocate the use of lexicographic max-min (LMM) rate allocation. To calculate the LMM rate allocation vector, we develop a polynomial-time algorithm by exploiting the parametric analysis (PA) technique from linear programming (LP), which we call serial LP with Parametric Analysis (SLP-PA). We show that SLP-PA can also be employed to address the LMM node lifetime problem much more efficiently than a state-of-the-art algorithm proposed in the literature. More importantly, we show that there exists an elegant duality relationship between the LMM rate allocation problem and the LMM node lifetime problem. Therefore, it is sufficient to solve only one of the two problems, and important insights can be obtained by inferring duality results for the other.
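The LMM objective above can be illustrated with a simple progressive-filling sketch. This is not the paper's SLP-PA algorithm (which solves a series of LPs with parametric analysis); it is a minimal stand-in assuming a single-path flow model where each node's flow crosses a fixed set of capacity-limited links: raise all unfixed rates together until some link saturates, fix the nodes on it, and repeat.

```python
def lmm_rate_allocation(links, routes):
    """Lexicographic max-min rate allocation via progressive filling.
    `links` maps link -> capacity; `routes` maps node -> set of links
    its flow traverses (illustrative single-path model)."""
    rates = {n: 0.0 for n in routes}
    fixed = set()
    cap = dict(links)
    while len(fixed) < len(routes):
        active = {n for n in routes if n not in fixed}
        # Smallest per-user headroom over all links still carrying
        # unfixed flows determines the next common rate increment.
        inc = min(
            cap[l] / sum(1 for n in active if l in routes[n])
            for l in cap
            if any(l in routes[n] for n in active)
        )
        newly_fixed = set()
        for l in list(cap):
            users = [n for n in active if l in routes[n]]
            if users:
                cap[l] -= inc * len(users)
                if cap[l] < 1e-9:      # link saturated: its users are fixed
                    newly_fixed.update(users)
        for n in active:
            rates[n] += inc
        fixed |= newly_fixed
    return rates
```

For example, with links a (capacity 1.0) and b (capacity 2.0), nodes routed over {a}, {a, b}, and {b} receive rates 0.5, 0.5, and 1.5: the bottleneck link a fixes its two users first, and the remaining node absorbs the leftover capacity of b.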
Keywords
Energy Constraint, Flow Routing, Lexicographic Max-Min, Linear Programming, Network Capacity, Node Lifetime, Parametric Analysis, Rate Allocation, Sensor Networks, Theory.
- Requirement Engineering Concepts in Risk Analysis
Authors
1 JNTU Kakinada, IN
2 CSE Department, Osmania University, IN
Source
Software Engineering, Vol 4, No 10 (2012), Pagination: 410-414
Abstract
Risk has always played an important role in system development, from the earliest days to the present, and therefore deserves careful attention. Risk refers to the protection of a product from interception, fabrication, modification, and interruption. Active and passive attacks can wreak havoc on the final software product. The solution to this problem is to address risk from the initial stages of product development. The first stage of any system development is the requirements engineering process. In the requirements engineering domain, risk analysis, risk management, and risk assessment are the most effective tools, because they help us compare requirements against the cost of risk measures. In this paper we point out the need to introduce risk analysis issues into the requirements engineering process, and we suggest some methods and tools for understanding risk from the early stages of information system development.
Keywords
Risk, Threat, Requirement Engineering, Vulnerability.
- A General Model for Sequential Pattern Mining with a Progressive Database
Authors
1 Sri Indu College of Engineering & Technology, IN
2 Nagarjuna University, Guntur, IN
Source
Software Engineering, Vol 2, No 9 (2010), Pagination: 209-223
Abstract
Although there have been many recent studies on the mining of sequential patterns in a static database and in a database with increasing data, these works, in general, do not fully explore the effect of deleting old data from the sequences in the database. When sequential patterns are generated, newly arriving patterns may not be identified as frequent sequential patterns due to the existence of old data and sequences. Even worse, obsolete sequential patterns that have not been frequent recently may stay in the reported results. In practice, users are usually more interested in recent data than in old data. To capture the dynamic nature of data addition and deletion, we propose a general model of sequential pattern mining with a progressive database, in which the data may be static, inserted, or deleted. In addition, we present a progressive algorithm, Pisa, which stands for Progressive mining of Sequential patterns, to progressively discover sequential patterns in a defined period of interest (POI). The POI is a sliding window that continuously advances as time goes by. Pisa utilizes a progressive sequential tree to efficiently maintain the latest data sequences, discover the complete set of up-to-date sequential patterns, and delete obsolete data and patterns accordingly. The height of the proposed sequential pattern tree is bounded by the length of the POI, thereby effectively limiting the memory space required by Pisa, which is significantly smaller than the memory needed by the alternative method, Direct Appending (DirApp). Note that sequential pattern mining with a static database and with an incremental database are special cases of progressive sequential pattern mining: by changing the start time and end time of the POI, Pisa can easily deal with a static database or an incremental database as well. The complexity of the proposed algorithms is analyzed.
The experimental results show that Pisa not only significantly outperforms the prior methods in execution time by orders of magnitude but also possesses graceful scalability.
Keywords
Progressive Sequential Pattern.
- Integration Algorithms for Ranking Higher in Search Results
Authors
1 Sri Indu College of Engineering and Technology, IN
Source
Software Engineering, Vol 2, No 9 (2010), Pagination: 224-228
Abstract
The result integration algorithm is the core of a meta-search engine. This paper focuses on evaluating the integration algorithms of a meta-search engine and compares the meta-search engine to general search engines through experiments. An experimental method to determine the priority of each participant engine of the meta-search engine is also proposed. The experimental results show that the meta-search engine can obtain higher-quality search results on average.
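One common way to integrate ranked lists from multiple engines is a weighted Borda count; the paper's actual merge algorithm is not specified here, so the following is only an illustrative sketch in which each engine's experimentally determined priority acts as its weight.

```python
def borda_merge(result_lists, weights=None):
    """Merge ranked result lists from several search engines with a
    weighted Borda count: a result at rank r in a list of length n
    earns (n - r) points, scaled by that engine's priority weight."""
    weights = weights or [1.0] * len(result_lists)
    scores = {}
    for w, results in zip(weights, result_lists):
        n = len(results)
        for rank, url in enumerate(results):
            scores[url] = scores.get(url, 0.0) + w * (n - rank)
    # Final integrated ranking: highest total score first.
    return sorted(scores, key=lambda u: -scores[u])
```

For instance, merging ["a", "b", "c"] (weight 1.0) with ["b", "a"] (weight 2.0) yields ["b", "a", "c"], since the higher-priority engine's preference for "b" dominates.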
Keywords
Evaluation, Search Engine, Merge Algorithm.
- Data Model Framework for Intruder Information Sharing in Sensor Networks
Authors
1 Sri Indu College of Engineering and Technology, IN
Source
Networking and Communication Engineering, Vol 2, No 9 (2010), Pagination: 291-300
Abstract
In sensor networks, an intruder (i.e., a compromised node) identified and isolated in one place can be relocated and/or duplicated to other places to continue attacks; hence, detection and isolation of the same intruder or its clones may have to be conducted repeatedly, wasting scarce network resources. Therefore, once an intruder is identified, it should be made known to all innocent nodes so that the intruder or its clones can be recognized when appearing elsewhere. However, secure, efficient and scalable sharing of intruder information remains a challenging and unsolved problem. To address this problem, we propose a three-tier framework, consisting of a verifiable intruder reporting (VIR) scheme, a quorum-based caching (QBC) scheme for efficiently propagating intruder reports to the whole network, and a collaborative Bloom Filter (CBF) scheme for handling intruder information locally. Extensive analysis and evaluations are also conducted to verify the efficiency and scalability of the proposed framework.
Keywords
Network Security, Routing Algorithms, Intruder Information Caches, Dedicated Membership Servers, IP Spoofing.
- Importance of IDPF to Avoid DDoS Attacks
Authors
1 Sri Indu College of Engineering & Technology, JNTU Hyderabad, IN
2 JNTU Hyderabad, IN
3 CSE Dept. at Sri Indu College of Engineering & Technology, IN
Source
Networking and Communication Engineering, Vol 2, No 9 (2010), Pagination: 308-318
Abstract
The Distributed Denial-of-Service (DDoS) attack is a serious threat to the legitimate use of the Internet. Prevention mechanisms are thwarted by the ability of attackers to forge or spoof the source addresses in IP packets. By employing IP spoofing, attackers can evade detection and put a substantial burden on the destination network for policing attack packets. In this paper, we propose an interdomain packet filter (IDPF) architecture that can mitigate the level of IP spoofing on the Internet. A key feature of our scheme is that it does not require global routing information. IDPFs are constructed from the information implicit in Border Gateway Protocol (BGP) route updates and are deployed in network border routers. We establish the conditions under which the IDPF framework correctly works in that it does not discard packets with valid source addresses. Based on extensive simulation studies, we show that, even with partial deployment on the Internet, IDPFs can proactively limit the spoofing capability of attackers. In addition, they can help localize the origin of an attack packet to a small number of candidate networks.
Keywords
IP Spoofing, DDoS, BGP, Network-Level Security and Protection, Routing Protocols.
- Providing Data Security through LED Technique in Wireless Sensor Networks
Authors
1 Sri Indu College of Engg & Tech, IN
2 Dept of CSE, Sri Indu College of Engg & Tech., IN
Source
Networking and Communication Engineering, Vol 2, No 10 (2010), Pagination: 423-428
Abstract
Providing desirable data security, that is, confidentiality, authenticity, and availability, in wireless sensor networks (WSNs) is challenging, as a WSN usually consists of a large number of resource-constrained sensor nodes that are generally deployed in unattended/hostile environments and, hence, are exposed to many types of severe insider attacks due to node compromise.
Existing security designs mostly provide a hop-by-hop security paradigm and thus are vulnerable to such attacks. Furthermore, existing security designs are also vulnerable to many types of Denial of Service (DoS) attacks, such as report disruption attacks and selective forwarding attacks and thus put data availability at stake.
In this paper, we seek to overcome these vulnerabilities for large-scale static WSNs. We come up with a location-aware end-to-end security framework in which secret keys are bound to geographic locations and each node stores a few keys based on its own location. This location-aware property effectively limits the impact of compromised nodes only to their vicinity without affecting end-to-end data security.
The proposed multifunctional key management framework assures both node-to-sink and node-to-node authentication along the report forwarding routes. Moreover, the proposed data delivery approach guarantees efficient en-route bogus data filtering and is highly robust against DoS attacks. The evaluation demonstrates that the proposed design is highly resilient against an increasing number of compromised nodes and effective in energy savings.
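The location-bound key idea can be sketched as follows, assuming keys are derived per grid cell with an HMAC over a master secret (a hypothetical construction for illustration only; the paper's actual key management scheme may differ). A node preloaded only with the keys of its own cell and its neighbors cannot reveal keys for distant regions when compromised, which is what limits the impact of compromise to the node's vicinity.

```python
import hashlib
import hmac

CELL_SIZE = 100  # meters per grid cell (illustrative deployment parameter)

def cell_of(x: float, y: float):
    """Map a deployment coordinate to its grid cell index."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def location_key(master_key: bytes, cell):
    """Bind a secret key to a geographic cell via HMAC-SHA256.
    Nodes store only the keys of their own cell (and perhaps its
    neighbors), so a captured node exposes keys for its vicinity alone."""
    msg = f"{cell[0]},{cell[1]}".encode()
    return hmac.new(master_key, msg, hashlib.sha256).digest()
```

Two nodes deployed in the same cell derive the same key and can authenticate reports to each other, while keys of different cells are unrelated.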
Keywords
Sensors, Hop-By-Hop, Selective Forwarding Attack, Report Disruption.
- Dynamic Routing with Security Considerations
Authors
1 CSE Dept. at Sri Indu College of Engineering & Technology, IN
2 Nagarjuna University, Guntur, IN
3 Sri Indu College of Engineering & Technology, JNTU Hyderabad, IN
Source
Networking and Communication Engineering, Vol 2, No 9 (2010), Pagination: 328-337
Abstract
Security has become one of the major issues for data communication over wired and wireless networks. Departing from past work on the design of cryptographic algorithms and system infrastructures, we propose a dynamic routing algorithm that randomizes delivery paths for data transmission. The algorithm is easy to implement and compatible with popular routing protocols, such as the Routing Information Protocol in wired networks and the Destination-Sequenced Distance-Vector protocol in wireless networks, without introducing extra control messages. An analytic study of the proposed algorithm is presented, and a series of simulation experiments are conducted to verify the analytic results and to show the capability of the proposed algorithm.
Keywords
Security-Enhanced Data Transmission, Dynamic Routing, RIP, DSDV.
- Mitigating Denial-Of-Service Attacks on the Chord Overlay Network: A Location Hiding Approach
Authors
1 CSE Dept. at Sri Indu College of Engineering & Technology, IN
2 Nagarjuna University, Guntur, IN
3 Sri Indu College of Engineering & Technology, JNTU Hyderabad, IN
Source
Networking and Communication Engineering, Vol 2, No 9 (2010), Pagination: 338-354
Abstract
Serverless distributed computing has received significant attention from both industry and the research community. Among the most popular applications are wide-area network file systems, exemplified by CFS, Farsite, and OceanStore. These file systems store files on a large collection of untrusted nodes that form an overlay network. They use cryptographic techniques to maintain file confidentiality and integrity against malicious nodes. Unfortunately, cryptographic techniques cannot protect a file holder from a denial-of-service (DoS) attack or a host compromise attack. Hence, most of these distributed file systems are vulnerable to targeted file attacks, wherein an adversary attempts to attack a small (chosen) set of files by attacking the nodes that host them. This paper presents Location Guard, a location-hiding technique for securing overlay file storage systems from targeted file attacks. Location Guard has three essential components: 1) a location key, consisting of a random bit string (e.g., 128 bits) that serves as the key to the location of a file; 2) a routing guard, a secure algorithm that protects accesses to a file in the overlay network given its location key such that neither its key nor its location is revealed to an adversary; and 3) a set of location inference guards, which form an extensible component of Location Guard. Our experimental results quantify the overhead of employing Location Guard and demonstrate its effectiveness against DoS attacks, host compromise attacks, and various location inference attacks.
Keywords
File Systems, Overlay Networks, Denial-Of-Service Attacks, Performance and Scalability, Location Hiding.
- Web Application Protection from Wide Range of Web Vulnerabilities
Authors
1 Department of CSE at Sri Indu College of Engineering & Technology, IN
2 Department of CSE, Sri Indu College of Engineering & Technology, JNTU, Hyderabad, IN
Source
Data Mining and Knowledge Engineering, Vol 2, No 10 (2010), Pagination: 265-271
Abstract
Adoption of web applications for multipurpose services is increasing, and their correct functioning is mission critical for many businesses. At the same time, web applications tend to be error prone, and implementation vulnerabilities are readily and commonly exploited by attackers. The design of countermeasures that detect or prevent such vulnerabilities, or protect against their exploitation, is an important research challenge for the fields of software engineering and security engineering. In this paper we introduce a single J2EE-based web application that can handle several vulnerabilities at the application level, mainly those related to injection flaws, cross-site scripting, and browser caching, while also protecting session data by changing the session identifier at runtime and enforcing sequential access and session expiration. By handling all of these together in one application, we can successfully protect a web application from the common vulnerabilities.
Keywords
Web Application, Vulnerabilities, Session Data, Security, Injection Flaw, Cross Site Scripting, Web Application Firewall (WAF).
- Determine the Properties of Objects to Maximum Clearance
Authors
1 Department of Computer Science and Engineering, Sri Indu College of Engineering & Technology, IN
2 CSE Dept. at Sri Indu College of Engineering & Technology, IN
3 Sri Indu College of Engineering & Technology, IN
Source
Data Mining and Knowledge Engineering, Vol 2, No 10 (2010), Pagination: 272-286
Abstract
In recent years, there has been significant interest in the development of ranking functions and efficient top-k retrieval algorithms to help users in ad hoc search and retrieval in databases (e.g., buyers searching for products in a catalog). We introduce a complementary problem: how to guide a seller in selecting the best attributes of a new tuple (e.g., a new product) to highlight so that it stands out in the crowd of existing competitive products and is widely visible to the pool of potential buyers. We develop several formulations of this problem. Although the problems are NP-complete, we give several exact and approximation algorithms that work well in practice. One class of exact algorithms is based on Integer Programming (IP) formulations of the problems. Another class of exact methods is based on maximal frequent itemset mining algorithms. The approximation algorithms are based on greedy heuristics. A detailed performance study illustrates the benefits of our methods on real and synthetic data.
Keywords
Data Mining, Knowledge and Data Engineering Tools and Techniques, Marketing, Mining Methods and Algorithms, Retrieval Models.
- A Novel Approach in Extracting Medical Reports Using Mining Technique
Authors
1 Sri Indu College of Engineering and Technology, IN
Source
Data Mining and Knowledge Engineering, Vol 2, No 9 (2010), Pagination: 221-226
Abstract
Medical text mining has gained increasing interest in recent years. Radiology reports contain rich information describing the radiologist's observations of the patient's medical conditions in the associated medical images. However, as most reports are in free-text format, the valuable information contained in those reports cannot be easily accessed and used unless proper text mining is applied. In this paper, we propose a text mining system to extract and use the information in radiology reports. The system consists of three main modules: a medical finding extractor, a report and image retriever, and a text-assisted image feature extractor. In evaluation, the overall precision and recall for medical finding extraction are 95.5% and 87.9%, respectively, and for all modifiers of the medical findings 88.2% and 82.8%, respectively. The overall result of the report and image retrieval module and the text-assisted image feature extraction module is satisfactory to radiologists.
Keywords
Text Mining, Medical Finding Extractor, Report and Image Retriever, and Text-Assisted Image Feature Extractor.
- Integrating E-Commerce & Data Mining Architecture Challenges
Authors
1 CSE Dept., Sri Indu College of Engg. & Tech., IN
2 Sri Indu College of Engg. & Tech., IN
3 ECE Dept., Sri Indu College of Engg. & Tech., IN
4 CSE Dept., Sri Indu College of Engg. & Tech., IN
Source
Data Mining and Knowledge Engineering, Vol 1, No 7 (2009), Pagination: 351-358
Abstract
We show that the e-commerce domain can provide all the right ingredients for successful data mining and claim that it is a killer domain for data mining. We describe an integrated architecture, based on our experience, for supporting this integration. The architecture can dramatically reduce the pre-processing, cleaning, and data understanding effort often documented to take 80% of the time in knowledge discovery projects. We emphasize the need for data collection at the application server layer (not the web server) in order to support logging of data and metadata that is essential to the discovery process. We describe the data transformation bridges required from the transaction processing systems and customer event streams (e.g., click streams) to the data warehouse. We detail the mining workbench, which needs to provide multiple views of the data through reporting, data mining algorithms, visualization, and OLAP. We conclude with a set of challenges.
Keywords
Data Mining, E-Commerce, Sessionization, OLAP, Sniffers.
- Embedded Widget Design Implementation of TCP/IP Micro Stack for Real Time Industrial Constraints Monitoring and Control
Authors
1 CSE Dept., Sri Indu College of Engg. & Technology, IN
2 Sri Indu College of Engg. & Technology, IN